Capacity, mutual information, and coding for finite-state Markov channels

Authors

  • Andrea J. Goldsmith
  • Pravin Varaiya
Abstract

The Finite-State Markov Channel (FSMC) is a discrete time-varying channel whose variation is determined by a finite-state Markov process. These channels have memory due to the Markov channel variation. We obtain the FSMC capacity as a function of the conditional channel state probability. We also show that for i.i.d. channel inputs, this conditional probability converges weakly, and the channel's mutual information is then a closed-form continuous function of the input distribution. We next consider coding for FSMCs. In general, the complexity of maximum-likelihood decoding grows exponentially with the channel memory length. Therefore, in practice, interleaving and memoryless channel codes are used. This technique results in some performance loss relative to the inherent capacity of channels with memory. We propose a maximum-likelihood decision-feedback decoder with complexity that is independent of the channel memory. We calculate the capacity and cutoff rate of our technique, and show that it preserves the capacity of certain FSMCs. We also compare the performance of the decision-feedback decoder with that of interleaving and memoryless channel coding on a fading channel with 4PSK modulation.

Index Terms: Finite-state Markov channels, capacity, mutual information, decision-feedback maximum-likelihood decoding.
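To make the central object concrete, the following is a minimal sketch of a two-state FSMC (a Gilbert-Elliott channel, a standard special case) together with the forward recursion for the conditional channel state probability that the abstract's capacity formula is expressed in terms of. All parameter values and function names here are illustrative assumptions, not the paper's notation.

```python
import random

# Assumed two-state Gilbert-Elliott FSMC: state "G" (good) or "B" (bad),
# each state acting as a binary symmetric channel on one input bit.
P = {("G", "G"): 0.95, ("G", "B"): 0.05,   # Markov state transitions
     ("B", "G"): 0.10, ("B", "B"): 0.90}
EPS = {"G": 0.01, "B": 0.30}               # per-state crossover probability

def step(state):
    """Advance the Markov channel state by one symbol interval."""
    return "G" if random.random() < P[(state, "G")] else "B"

def transmit(bit, state):
    """Pass one bit through the current state's binary symmetric channel."""
    return bit ^ (random.random() < EPS[state])

def update_belief(pi_g, bit, out):
    """One step of the forward recursion for the conditional state
    probability p(state = G | inputs and outputs so far)."""
    # Likelihood of the observed output under each state
    lg = EPS["G"] if out != bit else 1 - EPS["G"]
    lb = EPS["B"] if out != bit else 1 - EPS["B"]
    num_g = lg * pi_g
    num_b = lb * (1 - pi_g)
    post_g = num_g / (num_g + num_b)        # condition on the new output
    # Propagate the posterior through the state transition matrix
    return post_g * P[("G", "G")] + (1 - post_g) * P[("B", "G")]

random.seed(0)
state, pi_g = "G", 0.5                      # start with an uninformed belief
for _ in range(1000):
    bit = random.getrandbits(1)             # i.i.d. uniform channel inputs
    out = transmit(bit, state)
    pi_g = update_belief(pi_g, bit, out)
    state = step(state)
```

The recursion tracks exactly the quantity the abstract refers to: for i.i.d. inputs this conditional state probability converges weakly, which is what yields the closed-form expression for mutual information.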


Similar references


On Entropy and Lyapunov Exponents for Finite-State Channels

The Finite-State Markov Channel (FSMC) is a time-varying channel having states that are characterized by a finite-state Markov chain. These channels have infinite memory, which complicates their capacity analysis. We develop a new method to characterize the capacity of these channels based on Lyapunov exponents. Specifically, we show that the input, output, and conditional entropies for this ch...


Entropy and Mutual Information for Markov Channels with General Inputs

We study new formulas based on Lyapunov exponents for entropy, mutual information, and capacity of finite state discrete time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. Our methods allow for arbitrary input processes and channel dynamics, provided both have finite memory. We show that the entropy ra...


Capacity of Finite State Markov Channels with General Inputs

We study new formulae based on Lyapunov exponents for entropy, mutual information, and capacity of finite state discrete time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. Our methods allow for arbitrary input processes and channel dynamics, provided both have finite memory. We show that the entropy ra...


A Generalized Blahut-Arimoto Algorithm

Kavčić proposed in [1] an algorithm that optimizes the parameters of a Markov source at the input to a finite-state machine channel in order to maximize the mutual information rate. Numerical results for several channels indicated that his algorithm gives capacity-achieving input distributions. In this paper we prove that the stationary points of this algorithm indeed correspond one-to-one to t...



Journal:
  • IEEE Trans. Information Theory

Volume 42, Issue —

Pages —

Published 1996